# 128k Ultra-Long Context

## Qwen3 30B A6B 16 Extreme 128k Context

A fine-tuned version of the Qwen3-30B-A3B mixture-of-experts model, with the number of activated experts increased to 16 and the context window expanded to 128k, making it suitable for complex reasoning scenarios.

Tags: Large Language Model, Transformers
Author: DavidAU
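
As a rough illustration of what the listing describes, here is a minimal, untested sketch of loading such a checkpoint with Hugging Face Transformers and inspecting the expert and context settings. The repo id is assumed from the model name above (verify it on Hugging Face), and `num_experts_per_tok` is the attribute Qwen MoE configs generally use for the activated-expert count.

```python
# Minimal sketch (untested): load a Qwen3-MoE fine-tune and check its
# routed-expert count and context window. Repo id below is an assumption.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

repo_id = "DavidAU/Qwen3-30B-A6B-16-Extreme-128k-context"  # assumed repo id

config = AutoConfig.from_pretrained(repo_id)
print(config.num_experts_per_tok)       # expected: 16 for this fine-tune
print(config.max_position_embeddings)   # expected: ~131072 (128k)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # shard across available devices (needs accelerate)
)

inputs = tokenizer(
    "Explain mixture-of-experts routing briefly.", return_tensors="pt"
).to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```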
## Llama3.1 MOE 4X8B Gated IQ Multi Tier Deep Reasoning 32B GGUF

A Mixture-of-Experts (MoE) model based on the Llama 3.1 architecture, featuring gated IQ and multi-tier deep reasoning capabilities, supporting a 128k context length and multiple languages.

License: Apache-2.0
Tags: Large Language Model, Supports Multiple Languages
Author: DavidAU
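
Since this build ships as GGUF, a natural way to run it at the full 128k context is llama-cpp-python. The sketch below is illustrative only; the file name is hypothetical, so point `model_path` at whichever quantization you actually downloaded, and reduce `n_ctx` if memory is limited.

```python
# Minimal sketch (untested): run a GGUF quantization at a 128k context
# window with llama-cpp-python. The model file name is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./Llama3.1-MOE-4X8B-Deep-Reasoning-32B.Q4_K_M.gguf",
    n_ctx=131072,      # 128k tokens; large contexts need substantial RAM/VRAM
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

out = llm(
    "Summarize the key idea of gated mixture-of-experts routing.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```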